# Low Perplexity Model
## GerPT2 Large
- License: MIT
- Description: GerPT2 Large is the large version of the German GPT2 model, trained on the CC-100 corpus and German Wikipedia, and excels at German text generation tasks.
- Tags: Large Language Model, German
- Author: benjamin
- Stats: 75 · 9
## GerPT2
- License: MIT
- Description: GerPT2 is a large German language model based on the GPT2 architecture, trained on the CC-100 corpus and German Wikipedia, and outperforms comparable German GPT2 models.
- Tags: Large Language Model, German
- Author: benjamin
- Stats: 48 · 5